
Exploring Protein Language Model Architecture-Induced Biases for Antibody Comprehension

Liu, Mengren, Zhang, Yixiang, Zhang, Yiming

arXiv.org Artificial Intelligence

Recent advances in protein language models (PLMs) have demonstrated remarkable capabilities in understanding protein sequences. However, the extent to which different model architectures capture antibody-specific biological properties remains unexplored. In this work, we systematically investigate how architectural choices in PLMs influence their ability to comprehend antibody sequence characteristics and functions. We evaluate three state-of-the-art PLMs (AntiBERTa, BioBERT, and ESM2) against a general-purpose language model (GPT-2) baseline on antibody target specificity prediction tasks. Our results demonstrate that while all PLMs achieve high classification accuracy, they exhibit distinct biases in capturing biological features such as V gene usage, somatic hypermutation patterns, and isotype information. Through attention attribution analysis, we show that antibody-specific models like AntiBERTa naturally learn to focus on complementarity-determining regions (CDRs), while general protein models benefit significantly from explicit CDR-focused training strategies. These findings provide insights into the relationship between model architecture and biological feature extraction, offering valuable guidance for future PLM development in computational antibody design.
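The attention attribution analysis described in this abstract boils down to asking how much of a model's attention mass lands on CDR residues. The sketch below illustrates that idea generically; the `cdr_attention_fraction` helper, the toy attention matrix, and the CDR index set are all hypothetical illustrations, not code or data from the paper.

```python
# Toy sketch: measure how much attention mass a transformer PLM places on
# CDR residues. The attention matrix and CDR positions are made up.

def cdr_attention_fraction(attn, cdr_positions):
    """Fraction of total attention mass received by CDR columns.

    attn: square list-of-lists, attn[i][j] = attention from residue i to j.
    cdr_positions: set of 0-based residue indices lying inside CDR loops.
    """
    total = sum(sum(row) for row in attn)
    cdr_mass = sum(row[j] for row in attn for j in cdr_positions)
    return cdr_mass / total

# 6-residue toy example: positions 2-3 play the role of a CDR loop,
# and this hypothetical head attends to them heavily.
attn = [
    [0.10, 0.10, 0.40, 0.30, 0.05, 0.05],
    [0.10, 0.10, 0.35, 0.35, 0.05, 0.05],
    [0.05, 0.05, 0.40, 0.40, 0.05, 0.05],
    [0.05, 0.05, 0.40, 0.40, 0.05, 0.05],
    [0.10, 0.10, 0.30, 0.30, 0.10, 0.10],
    [0.10, 0.10, 0.30, 0.30, 0.10, 0.10],
]
print(cdr_attention_fraction(attn, {2, 3}))  # 0.7, well above the 2/6 uniform baseline
```

A CDR-biased model, in this framing, is simply one whose heads score well above the uniform baseline on this fraction.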





HelixDesign-Antibody: A Scalable Production-Grade Platform for Antibody Design Built on HelixFold3

Gao, Jie, Hu, Jing, Zhang, Shanzhuo, Zhu, Kunrui, Qian, Sheng, Huang, Yueyang, Zhang, Xiaonan, Fang, Xiaomin

arXiv.org Artificial Intelligence

Antibody engineering is essential for developing therapeutics and advancing biomedical research. Traditional discovery methods often rely on time-consuming and resource-intensive experimental screening. To enhance and streamline this process, we introduce HelixDesign-Antibody, a production-grade, high-throughput platform built on the high-accuracy structure prediction model HelixFold3. The platform facilitates the large-scale generation of antibody candidate sequences and evaluates their interaction with antigens. Integrated high-performance computing (HPC) support enables high-throughput screening, addressing challenges such as fragmented toolchains and high computational demands. Validation on multiple antigens showcases the platform's ability to generate diverse and high-quality antibodies, confirming a scaling law where exploring larger sequence spaces increases the likelihood of identifying optimal binders. This platform provides a seamless, accessible solution for large-scale antibody design and is available via the antibody design page of the PaddleHelix platform.
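The scaling law claimed here, that exploring a larger sequence space raises the chance of finding a strong binder, follows directly from independent-sampling statistics. The sketch below assumes a fixed per-candidate hit probability `p`; that value and the independence assumption are illustrative simplifications, not the platform's actual screening model.

```python
# Minimal sketch of the screening scaling law: if each sampled candidate
# is an acceptable binder with independent probability p, then a batch of
# n candidates contains at least one binder with probability 1 - (1 - p)^n.
# The value p = 1e-3 below is an arbitrary illustrative assumption.

def p_at_least_one_binder(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

p = 1e-3
for n in (100, 1_000, 10_000):
    print(n, round(p_at_least_one_binder(p, n), 4))
```

Even a tiny per-candidate hit rate becomes a near-certain screen at large n, which is the statistical argument for pairing generative models with HPC-scale sampling.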


Guided Generation for Developable Antibodies

Zhao, Siqi, Moller, Joshua, Quintero-Cadena, Porfi, van Niekerk, Lood

arXiv.org Artificial Intelligence

Therapeutic antibodies require not only high-affinity target engagement, but also favorable manufacturability, stability, and safety profiles for clinical effectiveness. These properties are collectively called 'developability'. To enable a computational framework for optimizing antibody sequences for favorable developability, we introduce a guided discrete diffusion model trained on natural paired heavy- and light-chain sequences from the Observed Antibody Space (OAS) and quantitative developability measurements for 246 clinical-stage antibodies. To steer generation toward biophysically viable candidates, we integrate a Soft Value-based Decoding in Diffusion (SVDD) module that biases sampling without compromising naturalness. In unconstrained sampling, our model reproduces global features of both the natural repertoire and approved therapeutics, and under SVDD guidance we achieve significant enrichment in predicted developability scores over unguided baselines. When combined with high-throughput developability assays, this framework enables an iterative, ML-driven pipeline for designing antibodies that satisfy binding and biophysical criteria in tandem.
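Soft value-based guidance of the kind described above can be approximated, at each sampling step, by drawing several candidates and resampling them with weights proportional to exp(value / temperature). The `soft_value_resample` helper and the toy developability score below are a generic sketch of that idea under stated assumptions, not the paper's SVDD implementation.

```python
import math
import random

def soft_value_resample(candidates, value_fn, temperature=1.0, rng=None):
    """Pick one candidate with probability proportional to exp(value / T).

    Generic sketch of soft value-based guidance: higher-value candidates
    are favored, but low-value ones keep nonzero probability, which is
    what lets guidance bias sampling without destroying naturalness.
    """
    rng = rng or random.Random()
    logits = [value_fn(c) / temperature for c in candidates]
    m = max(logits)                        # subtract max for numerical stability
    weights = [math.exp(l - m) for l in logits]
    return rng.choices(candidates, weights=weights, k=1)[0]

# Placeholder value function: penalize hydrophobic residues, a crude
# stand-in for a learned developability predictor (hypothetical).
def toy_developability(seq):
    return -sum(seq.count(a) for a in "FILVWY")

rng = random.Random(0)
cands = ["QVQLVQSG", "FWFWFWFW", "EVQLVESG"]
picked = soft_value_resample(cands, toy_developability, temperature=0.5, rng=rng)
print(picked)
```

Lowering the temperature sharpens the guidance toward the highest-value candidate; raising it recovers the unguided sampler.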


Antibody Foundational Model: Ab-RoBERTa

Huh, Eunna, Lee, Hyeonsu, Shin, Hyunjin

arXiv.org Artificial Intelligence

With the growing prominence of antibody-based therapeutics, antibody engineering has gained increasing attention as a critical area of research and development. Recent progress in transformer-based protein large language models (LLMs) has demonstrated promising applications in protein sequence design and structural prediction. Moreover, the availability of large-scale antibody datasets such as the Observed Antibody Space (OAS) database has opened new avenues for the development of LLMs specialized for processing antibody sequences. Among these, RoBERTa has demonstrated improved performance relative to BERT, while maintaining a smaller parameter count (125M) compared to the BERT-based protein model, ProtBERT (420M). This reduced model size enables more efficient deployment in antibody-related applications. However, despite the numerous advantages of the RoBERTa architecture, antibody-specific foundational models built upon it have remained inaccessible to the research community. In this study, we introduce Ab-RoBERTa, a RoBERTa-based antibody-specific LLM, which is publicly available at https://huggingface.co/mogam-ai/Ab-RoBERTa. This resource is intended to support a wide range of antibody-related research applications, including paratope prediction and humanness assessment.
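Antibody LLMs of this kind are typically pretrained with masked language modeling over per-residue tokens. As a hedged illustration of that input format, the vocabulary, special tokens, and token IDs below are placeholders invented for this sketch, not Ab-RoBERTa's actual tokenizer (which ships with the model on the Hugging Face Hub):

```python
# Sketch of RoBERTa-style masked-LM input for an antibody sequence.
# Vocabulary and special tokens here are illustrative placeholders only.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
SPECIALS = ["<s>", "</s>", "<mask>", "<pad>"]
VOCAB = {tok: i for i, tok in enumerate(SPECIALS + list(AMINO_ACIDS))}

def encode_masked(seq: str, mask_pos: int) -> list:
    """Tokenize per residue, mask one position, add BOS/EOS markers."""
    toks = ["<s>"] + list(seq) + ["</s>"]
    toks[mask_pos + 1] = "<mask>"  # +1 accounts for the leading <s> token
    return [VOCAB[t] for t in toks]

heavy_fragment = "EVQLVESGGGLVQPGG"  # start of a typical VH framework region
ids = encode_masked(heavy_fragment, mask_pos=3)
print(ids[:6])
```

During pretraining the model learns to predict the residue hidden behind `<mask>`; the same representations then transfer to downstream tasks such as paratope prediction or humanness scoring.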